Reviewer: C J Mramer
October 25, 2018
Subject: The Desperate Need for Conceptual Analysis in AI
"Our ultimate objective is to make programs that learn from their experience as effectively as humans do." (John McCarthy, "Programs with Common Sense," p. 404)

What does it mean for a human being to 'learn from experience'? First, take the word 'experience'. If we take it to mean the arousal of an organism's internal or external receptor neurons, resulting in some measurable change in the distribution of electrochemical energy in the brain, it is hard to see what analogical counterpart this might have in a computer.

Second, take the word 'learn', or the verb phrase 'to learn'. What does it mean for a human being to learn something? From a strictly behaviorist point of view, which is the only one that makes sense to me when we are discussing the overt, observable actions of organisms, we usually say an organism has learned something when it responds differently to the same or a very similar stimulus on temporally distinct occasions. Again, what would count as a 'stimulus' in a computer, and what would count as a 'response'? Are the words 'input' and 'output' operationally equivalent? If so, what is the role of the programmer in this model?

Finally, the conceptual constant 'person' is undefined in McCarthy's modal system of situations and causes (cf. pp. 412-415). Until we have a better understanding of that concept, the AI project will generate unwarranted confusion and fear.
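The behaviorist criterion invoked above, that an organism has learned when it responds differently to the same stimulus on temporally distinct occasions, can at least be mimicked in a program, which is part of what makes the conceptual question pressing. Here is a minimal sketch, with all names (the `Agent` class, the "explore"/"habituate" responses) purely illustrative and not drawn from McCarthy's paper; it trivially satisfies the criterion without, plausibly, learning anything in the human sense.

```python
# A minimal program that satisfies the behaviorist criterion of learning:
# it responds differently to the identical stimulus on temporally
# distinct occasions. All names here are illustrative assumptions.

class Agent:
    def __init__(self):
        # Internal state: how many times each stimulus has been seen.
        self.history = {}

    def respond(self, stimulus):
        count = self.history.get(stimulus, 0)
        self.history[stimulus] = count + 1
        # First exposure provokes one response; repetition another.
        return "explore" if count == 0 else "habituate"

agent = Agent()
first = agent.respond("light")   # "explore"
second = agent.respond("light")  # "habituate"
# Same stimulus, different responses on distinct occasions:
# the behaviorist criterion is met by a few lines of bookkeeping.
assert first != second
```

If this counts as learning, then 'input' and 'output' are doing the work of 'stimulus' and 'response', and the criterion seems too weak; if it does not count, the criterion needs supplementing, which is exactly the conceptual analysis being called for.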